BK-SAD: A Large Scale Dataset for Student Activity Recognition

Authors: Trinh Cong Dong, Nguyen Van Giang*, Nguyen Chan Hung, Nguyen Quang Dich
https://doi.org/10.51316/jst.168.ssad.2023.33.3.3

Abstract

Skeleton-based human action recognition has emerged as a prominent research topic in artificial intelligence due to its broad applicability across domains including healthcare, security and surveillance, entertainment, and intelligent environments. In this paper, we propose a novel data collection methodology and present the BK-Student Activity Dataset (BK-SAD), a new 2D dataset for student activity recognition in smart classrooms that outperforms existing datasets such as NTU RGB+D 120 and the SBU Kinect Interaction dataset. Our dataset contains three classes: hand raising, dozing off, and normal activities. The data were collected using cameras placed in real classroom environments and consist of videos captured from multiple viewpoints: over 2700 videos of students raising their hands, over 1700 videos of students dozing off during class, and over 8500 videos of normal activities. In addition, to evaluate the effectiveness of the proposed dataset, we report baseline performance figures for neural network architectures trained and tested for student activity recognition on BK-SAD. These ConvNet architectures demonstrate significant performance improvements on the proposed dataset. The proposed data collection methodology and the BK-SAD dataset will enable further research and development of activity recognition models for classroom environments, with potential applications in smart education and intelligent classroom management systems. BK-SAD is available at https://visedu.vn/en/bk-sad-dataset
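Since the abstract describes classifying 2D skeleton data into three activity classes, the pipeline can be sketched as follows. This is a minimal illustration, not the paper's method: the pose array shapes, the hand-crafted features, and the nearest-centroid classifier (standing in for the ConvNet baselines) are all assumptions for demonstration, and the data below is synthetic.

```python
import numpy as np

# Hypothetical sketch of skeleton-based activity classification.
# BK-SAD provides videos; here we assume 2D poses have already been
# extracted as (frames, joints, 2) arrays of (x, y) keypoints.
# Only the class names come from the abstract.
CLASSES = ["hand_raising", "dozing_off", "normal"]

def sequence_features(seq: np.ndarray) -> np.ndarray:
    """Summarize a (T, J, 2) keypoint sequence as a fixed-length vector:
    per-joint mean position plus per-joint mean frame-to-frame motion."""
    mean_pose = seq.mean(axis=0)                        # (J, 2)
    motion = np.abs(np.diff(seq, axis=0)).mean(axis=0)  # (J, 2)
    return np.concatenate([mean_pose.ravel(), motion.ravel()])

class NearestCentroid:
    """Tiny nearest-centroid classifier, an illustrative stand-in for
    the ConvNet baselines mentioned in the abstract."""
    def fit(self, X, y):
        self.centroids_ = np.stack(
            [X[y == c].mean(axis=0) for c in range(len(CLASSES))])
        return self

    def predict(self, X):
        # Distance from every sample to every class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return d.argmin(axis=1)

# Synthetic demo: 30-frame sequences with 17 joints, one cluster per class.
rng = np.random.default_rng(0)
def make_seq(offset: float) -> np.ndarray:
    return rng.normal(offset, 0.05, size=(30, 17, 2))

X = np.stack([sequence_features(make_seq(c))
              for c in range(3) for _ in range(10)])
y = np.repeat(np.arange(3), 10)

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

In practice a real baseline would replace `sequence_features` and `NearestCentroid` with a ConvNet over the keypoint sequences, but the data flow (pose sequence in, class label out) is the same.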

Keywords

Dataset, Action recognition, Skeleton pose, Student activity recognition, Smart classroom
Pages: 16-23
